Statistical Learning Theory
Instructors: R. Castro and A. Singh
Lecture 17: Minimax Lower Bounds (Lower Performance Bounds)

Author

  • A. Singh
Abstract

* An observation model Pf, indexed by f ∈ F, where Pf denotes the distribution of the data under model f. E.g., in regression and classification this is the distribution of Z = (X1, Y1, . . . , Xn, Yn) ∈ Z. We will assume that Pf is a probability measure on the measurable space (Z, B).
* A performance metric d(·, ·) ≥ 0. Given a model estimate f̂n, the performance of that estimate relative to the true model f is d(f̂n, f). E.g.
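The two ingredients above combine into the minimax risk, the central object these notes lower-bound. A standard formulation (not spelled out in this excerpt, but consistent with the definitions of Pf and d) is:

```latex
% Minimax risk: the best achievable worst-case expected performance
% over the model class F, where the infimum ranges over all estimators.
R_n^{*} \;=\; \inf_{\hat f_n} \, \sup_{f \in \mathcal{F}} \, \mathbb{E}_{P_f}\!\bigl[\, d(\hat f_n, f) \,\bigr]
```

A minimax lower bound is a statement of the form R_n^* ≥ c ψ_n for some rate ψ_n, showing that no estimator can do better uniformly over F.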


Similar resources

Localized Upper and Lower Bounds for Some Estimation Problems

We derive upper and lower bounds for some statistical estimation problems. The upper bounds are established for the Gibbs algorithm. The lower bounds, applicable for all statistical estimators, match the obtained upper bounds for various problems. Moreover, our framework can be regarded as a natural generalization of the standard minimax framework, in that we allow the performance of the estima...


On Bayes Risk Lower Bounds

This paper provides a general technique for lower bounding the Bayes risk of statistical estimation, applicable to arbitrary loss functions and arbitrary prior distributions. A lower bound on the Bayes risk not only serves as a lower bound on the minimax risk, but also characterizes the fundamental limit of any estimator given the prior knowledge. Our bounds are based on the notion of f -inform...


Tight Lower Bounds for Homology Inference

The homology groups of a manifold are important topological invariants that provide an algebraic summary of the manifold. These groups contain rich topological information, for instance, about the connected components, holes, tunnels and sometimes the dimension of the manifold. In earlier work [1], we have considered the statistical problem of estimating the homology of a manifold from noiseles...


A strong converse bound for multiple hypothesis testing, with applications to high-dimensional estimation

In statistical inference problems, we wish to obtain lower bounds on the minimax risk, that is, to bound the performance of any possible estimator. A standard technique to do this involves the use of Fano’s inequality. However, recent work in an information-theoretic setting has shown that an argument based on binary hypothesis testing gives tighter converse results (error lower bounds) than Fan...
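For context, the Fano argument mentioned in this abstract reduces estimation over a class to an M-ary hypothesis test: pick well-separated models f_1, …, f_M, draw an index V uniformly, and bound the error of any test ψ. The standard form of the inequality (assumed here; it is not stated in the excerpt) is:

```latex
% Fano's inequality: the error probability of any test \psi of M
% equiprobable hypotheses is bounded below via the mutual information
% I(V;Z) between the hypothesis index V and the observed data Z.
\mathbb{P}\bigl(\psi(Z) \neq V\bigr) \;\ge\; 1 - \frac{I(V;Z) + \log 2}{\log M}
```

The "strong converse" approach discussed in the abstract replaces this step with binary hypothesis tests, which can yield sharper constants and rates.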


Minimax Theory for High-dimensional Gaussian Mixtures with Sparse Mean Separation

While several papers have investigated computationally and statistically efficient methods for learning Gaussian mixtures, precise minimax bounds for their statistical performance as well as fundamental limits in high-dimensional settings are not well-understood. In this paper, we provide precise information theoretic bounds on the clustering accuracy and sample complexity of learning a mixture...



Publication date: 2007